Suppose we want to prepare data and compare several scalers and classifiers on a classification problem, tuning the classifiers' parameters with a grid search.

Preparing the data:


In [1]:
from sklearn.datasets import make_classification


X, y = make_classification()
data = {'X': X, 'y': y}
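With its default arguments, make_classification generates 100 samples, 20 features, and two classes; a quick sanity check (a sketch, not part of the Pipeliner workflow):

```python
from sklearn.datasets import make_classification

# Defaults: n_samples=100, n_features=20, n_classes=2
X, y = make_classification()
print(X.shape)   # (100, 20)
print(set(y))    # {0, 1}
```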

Setting the steps for our pipelines and the parameters for the grid search:


In [2]:
from reskit.core import Pipeliner


from sklearn.preprocessing import StandardScaler
from sklearn.preprocessing import MinMaxScaler

from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC


classifiers = [('LR', LogisticRegression()),
               ('SVC', SVC())]

scalers = [('standard', StandardScaler()),
           ('minmax', MinMaxScaler())]

steps = [('scaler', scalers),
         ('classifier', classifiers)]

param_grid = {'LR': {'penalty': ['l1', 'l2']},
              'SVC': {'kernel': ['linear', 'poly', 'rbf', 'sigmoid']}}
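Each scaler/classifier pair corresponds to an ordinary scikit-learn Pipeline. For example, the standard/LR combination could be tuned by hand roughly like this (a sketch of the equivalent plain-scikit-learn code, not Pipeliner's actual implementation; the liblinear solver is chosen here because it supports the 'l1' penalty):

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(random_state=0)

# One scaler/classifier combination as a scikit-learn Pipeline
pipe = Pipeline([('scaler', StandardScaler()),
                 ('classifier', LogisticRegression(solver='liblinear'))])

# Step parameters are addressed as '<step>__<param>' in GridSearchCV
grid = GridSearchCV(pipe,
                    param_grid={'classifier__penalty': ['l1', 'l2']},
                    scoring='roc_auc', cv=3)
grid.fit(X, y)
print(grid.best_params_)
```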

Creating the experiment plan:


In [3]:
pipe = Pipeliner(steps, param_grid=param_grid)
pipe.plan_table


Out[3]:
scaler classifier
0 standard LR
1 standard SVC
2 minmax LR
3 minmax SVC
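The plan table is simply the Cartesian product of the options at each step, which can be reproduced with itertools (a sketch for illustration):

```python
from itertools import product

scalers = ['standard', 'minmax']
classifiers = ['LR', 'SVC']

# Every scaler paired with every classifier, in step order
plan = list(product(scalers, classifiers))
for i, (scaler, clf) in enumerate(plan):
    print(i, scaler, clf)
```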

To tune the models' parameters and evaluate them, run:


In [4]:
pipe.get_results(data=data, scoring=['roc_auc'])


Line: 1/4
Line: 2/4
Line: 3/4
Line: 4/4
Out[4]:
scaler classifier grid_roc_auc_mean grid_roc_auc_std grid_roc_auc_best_params eval_roc_auc_mean eval_roc_auc_std eval_roc_auc_scores
0 standard LR 0.987868 0.0052723 {'penalty': 'l1'} 0.987723 0.00532019 [ 0.99307958 0.98961938 0.98046875]
1 standard SVC 0.984485 0.00386297 {'kernel': 'poly'} 0.98456 0.00386095 [ 0.97923875 0.98615917 0.98828125]
2 minmax LR 0.992647 0.00635158 {'penalty': 'l1'} 0.992485 0.00639273 [ 1. 0.99307958 0.984375 ]
3 minmax SVC 0.990368 0.00202087 {'kernel': 'rbf'} 0.990327 0.00202176 [ 0.98961938 0.99307958 0.98828125]
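The eval_roc_auc_* columns report cross-validated scores obtained with each row's best parameters; the three scores per row suggest 3-fold cross-validation. A similar evaluation for the minmax/SVC row could be reproduced with cross_val_score (a sketch under that 3-fold assumption, not Pipeliner's exact procedure):

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X, y = make_classification(random_state=0)

# The minmax/SVC pipeline with the best kernel found above
pipe = Pipeline([('scaler', MinMaxScaler()),
                 ('classifier', SVC(kernel='rbf'))])

scores = cross_val_score(pipe, X, y, scoring='roc_auc', cv=3)
print(scores, scores.mean(), scores.std())
```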